Corrections to “Generalization Bounds via Information Density and Conditional Information Density”

Authors

Abstract

An error in the proof of the data-dependent tail bounds on generalization presented in Hellström and Durisi (2020) is identified, and a correction is proposed. Furthermore, we note that the absolute continuity requirements need to be strengthened to avoid measurability issues.
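For context, the quantity at issue in such bounds is the information density between the hypothesis W and the training data Z^n. A sketch of the standard definition (background, not taken from the correction itself):

```latex
% Information density: log Radon-Nikodym derivative of the joint
% distribution with respect to the product of the marginals.
\imath(w, z^n) \;=\; \log \frac{\mathrm{d} P_{W Z^n}}{\mathrm{d}\,\bigl(P_W \otimes P_{Z^n}\bigr)}(w, z^n)
```

This quantity is well defined only when the absolute continuity condition P_{WZ^n} ≪ P_W ⊗ P_{Z^n} holds, which is the type of requirement the note argues must be strengthened to avoid measurability issues.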

Related articles

Separability of Density Matrices and Conditional Information Transmission

We give necessary and sufficient conditions under which a density matrix acting on a two-fold tensor product space is separable. Our conditions are given in terms of quantum conditional information transmission.

Conditional Density Estimation via Least-Squares Density Ratio Estimation

Estimating the conditional mean of an input-output relation is the goal of regression. However, regression analysis is not sufficiently informative if the conditional distribution has multi-modality, is highly asymmetric, or contains heteroscedastic noise. In such scenarios, estimating the conditional distribution itself would be more useful. In this paper, we propose a novel method of condition...

Consistency and Generalization Bounds for Maximum Entropy Density Estimation

We investigate the statistical properties of maximum entropy density estimation, both for the complete data case and the incomplete data case. We show that under certain assumptions, the generalization error can be bounded in terms of the complexity of the underlying feature functions. This allows us to establish the universal consistency of maximum entropy density estimation.

Information Geometric Density Estimation

We investigate kernel density estimation where the kernel function varies from point to point. Density estimation in the input space amounts to finding a set of coordinates on a statistical manifold. This novel perspective helps to combine efforts from information geometry and machine learning to spawn a family of density estimators. We present example models with simulations. We discuss the princip...

Information Density and Syntactic Repetition

In noun phrase (NP) coordinate constructions (e.g., NP and NP), there is a strong tendency for the syntactic structure of the second conjunct to match that of the first; the second conjunct in such constructions is therefore low in syntactic information. The theory of uniform information density predicts that low-information syntactic constructions will be counterbalanced by high information in...

Journal

Journal title: IEEE Journal on Selected Areas in Information Theory

Year: 2021

ISSN: 2641-8770

DOI: https://doi.org/10.1109/jsait.2021.3088240